    Virtual Control and Synthesis of Music Performances: Qualitative Evaluation of Synthesized Timpani Exercises

    The increasing availability of software for creating real-time simulations of musical instrument sounds allows for the design of new visual and sounding media. Recent decades have especially focused on controlling real and virtual instruments through natural gestures. In this paper, we present and extensively evaluate a framework (Figure 1) for controlling virtual percussion instruments by modeling and simulating the gestures of virtual percussionists. By placing the virtual performer at the center of the gesture-sound synthesis system, we aim to provide original tools for analyzing and synthesizing instrumental gesture performances. Our physics-based approach to gesture simulation brings insight into how the biomechanical parameters of a gesture affect the instrumental performance. Simulating both gesture and sound with physical models also leads to a coherent, human-centered interaction and provides new ways of exploring the mapping between gesture and sound. The use of motion-capture data enables the realistic synthesis of both pre-recorded and novel percussion sequences from the specification of gesture scores. Such scores involve motion-editing techniques applied to simple beat attacks. We therefore propose an original gesture language based on instrumental playing techniques. This language is characterized by expressivity, interactivity with the user, and the ability to account for co-articulation between gesture units. Finally, 3D visual rendering synchronized with sound rendering allows us to compare virtual performances with real ones and to qualitatively evaluate both the pedagogical and compositional capabilities of such a system.
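To make the idea of physics-based gesture simulation driving sound excitation concrete, here is a minimal sketch: a one-dimensional mass-spring-damper stands in for the drumstick tip being driven toward the membrane, and the velocity at impact is mapped to an excitation amplitude. All parameters, function names, and the mapping itself are illustrative assumptions, not the paper's actual biomechanical model.

```python
def simulate_beat(m=0.05, k=40.0, c=0.8, z0=0.2, dt=1e-4, t_max=1.0):
    """Toy physics-based beat gesture (illustrative, not the paper's model).

    A point mass m (stick tip) starts at height z0 above the membrane
    (z = 0) and is pulled toward it by a muscle-like spring of stiffness
    k with damping c.  Semi-implicit Euler integration; returns the
    speed at the moment the tip reaches the membrane, or 0.0 if it
    never does within t_max seconds.
    """
    z, v, t = z0, 0.0, 0.0
    while t < t_max:
        a = (-k * z - c * v) / m   # spring toward z = 0, plus damping
        v += a * dt                # semi-implicit Euler: velocity first,
        z += v * dt                # then position
        t += dt
        if z <= 0.0:
            return abs(v)          # impact velocity of the beat
    return 0.0


def excitation_amplitude(v_impact, v_ref=5.0):
    """Simple gesture-to-sound mapping: normalized, clamped impact
    velocity used as the amplitude of the percussive excitation
    (a hypothetical mapping, chosen for illustration)."""
    return min(1.0, v_impact / v_ref)
```

A stiffer "muscle" spring yields a faster stroke and hence a louder excitation, which is the kind of biomechanical-parameter-to-sound relationship the framework is designed to expose.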